Automated machining head with vision and operating procedure of the head
Patent abstract:
AUTOMATED MACHINING HEAD WITH VISION AND RESPECTIVE PROCESS. Automated machining head, and corresponding process, comprising a pressure base provided with side windows with opening and closing capability, which surrounds the machining tool, associated with a vertical displacement device equipped with a mechanical lock, a vision equipment connected to computer equipment, and a communications module. The main advantage offered by the presented invention is that it gives an anthropomorphic robot, originally intended for the automotive industry and of relatively low precision, a significantly higher machining accuracy, equivalent to that of much more precise equipment or parallel kinematic machines, while also compensating, in real time and continuously, for the displacement and loss of perpendicularity caused by the pressure of the pressure foot, which are common in conventional heads and a source of inaccuracies and errors.
Publication number: BR112015028755B1
Application number: R112015028755-7
Filing date: 2014-05-15
Publication date: 2021-02-02
Inventors: Jordi ANDUCAS AREGALL; Carlos GANCHEGUI ITURRIA; José Javier Galarza Cambra
Applicant: Loxin 2002, S.L.
IPC main classification:
Patent description:
DESCRIPTIVE REPORT

[0001] This description refers, as its title indicates, to an automated machining head with vision, of the type used industrially in association with anthropomorphic robot arms to perform various machining tasks, especially drilling and riveting, under the control of a robot controller module. The head comprises a pressure base with side windows with opening and closing capability, which surrounds the machining tool, associated with an axial displacement device with mechanical lock, a vision equipment connected to computer equipment, and a communications module between the latter and the robot controller module, which allows the vision equipment to interact with the robot controller, all with a characteristic operating procedure.

Field of invention

[0002] The invention refers mainly, but not exclusively, to the field of machining heads, especially for drilling and riveting, associated with the arms of anthropomorphic robots.

State of the art

[0003] Anthropomorphic robots are currently well known and widely used in industry, especially in the automobile industry. They are versatile and relatively economical devices, but their main disadvantages are their lack of rigidity and, consequently, their lack of precision, with errors that can exceed 2 mm. This makes them unsuitable for applications in which the precision requirements are several orders of magnitude higher, such as, for example, machining, drilling and riveting applications in the aeronautical industry, where precision of hundredths or thousandths of a mm is required.

[0004] This precision can be achieved by means of high-precision equipment or parallel kinematics machines, but these have the inconvenience of their high cost, caused by the precision technology required for their manufacture and by their control technologies.

[0005] With anthropomorphic robots, there are applications that improve their accuracy through the use of external measurement systems in which, for example, a laser tracking device detects the position in space of the robot's head and sends the corresponding correction orders. However, in addition to the high cost of this equipment, such systems have the great inconvenience that the line of sight between the robot and the external measuring equipment must always remain clear, which is a serious limitation and in most applications is not possible.

[0006] Efforts have also been made to improve the intrinsic precision of anthropomorphic robots, usually by modifying series robots to add high-precision secondary encoders on the output axes of the gearboxes that move the axes of the robot arm, in some cases also replacing the robot controller with a numerical control, thereby partially increasing their rigidity and improving their precision. These solutions, however, present the drawbacks of a high economic cost, with which one of the greatest advantages of these robots is lost, as well as maintenance, adjustment and spare-parts problems, since they are no longer standard or series robots from the manufacturer's catalogue, creating an extra dependency of the customer or end user on the company that modifies the robots.

[0007] The patents ES2152171A1 and WO2007108780A2 are also known, which incorporate conventional vision equipment in machine tools, but only to provide a good view of the work area, without being able to increase the accuracy.
[0008] Applications of video cameras in robots are also known, as found, for example, in the patents WO 03064116A2, US 2010286827A1, US 2003144765A1, ES2142239A1, ES2036909A1 and CN 101205662, but, as in the previous case, their mission is to provide a good view of the work area during programming of the robot, without being able to increase the accuracy automatically.

[0009] In addition, robots with two cameras are known, as can be seen in the patents CN 101726296 and WO 2006019970A2, but these do not contribute to improving the accuracy of the robot either, being used only for the recognition of shapes or objects.

[0010] Some procedures are also known that improve the intrinsic precision of anthropomorphic robots without vision equipment, based on purely mechanical elements, such as, for example, that described in US patent 2009018697, in which a mechanical system is used to measure the deviations of the robot when additional forces are applied to it. These, however, present the problem that, when a mechanical slip occurs between the workpiece and the measuring nozzle, it is no longer possible to return to the objective point.

Description of the invention

[0011] To solve the current problem regarding precision in machining, by improving the perpendicularity and the precision of the movements of robotic arms, the automated machining head with vision that is the object of the present invention has been devised. It comprises a pressure base, which surrounds the machining tool, associated with an axial displacement device along the tool axis equipped with a mechanical lock, together with vision equipment, comprising several video cameras and optionally a laser projector, connected to computer equipment provided with specific software for three-dimensional control, and a communications module that allows it to interact with the robot controller. The vision equipment will preferably be of the 3D type.

[0012] The pressure base is formed by a bell equipped with side windows that allow the artificial vision camera or cameras to view the work surface through the openings of the bell while the pressure base is performing its function, that is, while it is in the working position. These side windows have closures that prevent chips from escaping during machining, and the pressure base itself incorporates a suction system to evacuate the dust and chips generated during machining.

[0013] The computer equipment is connected, via the communications module, with the controller module of the robot arm, preferably of an anthropomorphic type, which provides the movements to the machining head, making corrections to the orders of the robot controller module as a function of the images received from the video cameras that form the vision equipment and of the calculations and predictions that it performs.

[0014] The robot controller module can be either an external numerical control or the robot controller that the robot manufacturers themselves offer.

[0015] This machining head with vision includes a specific operating procedure that allows the cancellation of external forces and the correction of the position.

[0016] The cancellation of external forces starts from the known fact that, when a small additional force is applied to its working end or to another part of the assembly, the robot, due to its very low rigidity, loses the position and orientation it had reached, without its controller being aware of it, and for that reason the controller will not try to return the robot arm to its initial pose.
[0017] In this procedure, the kinematic information of the robot is complemented by means of the vision system. This information makes it possible to reposition the robot, returning it to the correct position and orientation it had before the force that changed them was applied. In the force-cancellation process it has to be taken into account that the robot faces a surface on which it wants to perform an action that involves the application of a force which will modify its real position, without this movement being indicated directly to the robot controller. This part of the process performs the following functions:

[0018] 1. The robot is positioned in front of the work surface.

[0019] 2. The vision equipment reads the surface and its surroundings, fixes the exact point of operation on the surface and obtains the spatial coordinates of the robot.

[0020] 3. An additional force is applied to the robot, in this case, for example, by the pressure base pressing against the work surface, which causes the robot to lose its position. No robot device informs the robot controller that the position has been lost, since the loss is due to mechanical deformations.

[0021] 4. The robot makes a request to the vision equipment to read the surface again and measure the movement that took place.

[0022] 5. The vision equipment reads the surface and obtains the movement between the current moment and the moment before the force was applied. In this way, the device is able to detect the existing deviation externally and indicate to the robot controller how much and how to correct its position in order to return to the operating point.

[0023] The last two steps may or may not be iterative, until the robot is able to return to the operating point or the residual error is less than a determined value.

[0024] As noted earlier, on the one hand robots are not particularly accurate devices with regard to positioning or orientation accuracy and, on the other hand, the application of additional forces, once the robot has reached a certain position, also changes its orientation as well as its position. However, many of the tasks and operations to be carried out by the head of the robot arm require the robot to adopt a certain correct orientation with respect to the surface of the part at the work point in order to properly perform the main function for which it was designed. An example of this is the performance of high-precision drilling and milling on an aerodynamic surface, in which it is of vital importance to adopt an orientation that is absolutely "normal", or perpendicular, to the workpiece surface at each point of operation.

[0025] The system and procedure presented here allow the robot to recover its original orientation (the one it had before the external forces were applied), assuming that it is sufficiently adequate to perform the function, or at least to adopt an orientation normal to the surface of the part at the work point.

[0026] The orientation correction procedure is analogous to the position correction described previously for the application of additional forces, and can be performed at the same time. Specifically, the functions for reorienting the robot are the same, except that:

[0027] 2. The vision equipment, when reading the surface, also calculates and stores the initial orientation of the robot, in case it is desired to recover the same starting orientation that existed before the additional external forces were applied.

[0028] 4. The robot makes a request to the vision equipment to read the surface again and measure the current orientation.

[0029] 5. The vision equipment, by reading the surface around the point and performing the normalization calculations, is able to detect how far the robot's orientation deviates from the original orientation or from the normal to the surface, and to indicate to the robot how much and how to correct its orientation.

[0030] The last two steps may or may not be iterative, until the robot has returned to the desired orientation, a tolerance for the maximum permitted orientation error having been previously established.

[0031] In the present invention, because the vision system is able to visualize the surface of the part before and while the additional external forces are being applied to the robot, fixing the work point on the part and calculating the orientation with respect to it at the same time, it is possible to eliminate the consequences of said forces by returning the robot to the desired position and orientation.

[0032] It is known and accepted that the precision of anthropomorphic robots is not an important parameter in the usual use for which they were initially conceived. Their work philosophy is traditionally based on physically taking the robot arm to each of the desired positions and constructing the part program by storing these positions in the robot's memory ("teaching"). Normally, operations with this type of robot are of high cadence and involve few points (a few dozen at most). It is therefore not important for the robot to reach an exact XYZ coordinate within its working volume; what matters is that the robot is repeatable, that is, that it more or less always goes to the same place.

[0033] The process presented here allows the accuracy of the robot to become almost identical to its repeatability. To achieve this goal, the robot uses an external element, preferably a three-dimensional vision device, to determine the position of elements that it will use as an external reference.

[0034] The use of an external reference makes it possible to achieve much greater precision in real time. The robot's kinematics are corrected in real time so that the robot's repeatability and accuracy become very similar. In this way, the precision can be determined with a high resolution, since the system can correct the final position to which the robot has to move or arrive on a plane or straight line.

[0035] The referencing process requires a minimum of two points to draw a virtual line. If a reference plane is to be determined, the system will need a minimum of three points to calculate it with the same precision.

[0036] This part of the process performs the following functions, for two reference points:
• The robot goes to a programmed point without needing to be precise; at this point it expects to find a target that will be used as a reference point.
• The vision equipment asks the robot to perform some translation movements around the reference point or target while inspecting that point.
• Reference point 1 is determined.
• The robot goes to the second programmed point.
• The vision equipment asks the robot to perform some translation movements around the reference point or target while inspecting that point.
• Reference point 2 is determined.
• The line defined by reference point 1 and reference point 2 is determined.
• The software corrections that must be applied to the intermediate points between the reference points, or to points close to the referred trajectory, are determined to compensate for the mechanical distortions, achieving a positioning error similar to the repeatability of the robot.
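Purely as an illustration of the correction described in the list above, the following sketch shows one way the deviations measured at the two reference targets could be blended for an intermediate programmed point. The function name, the use of NumPy and the linear blending of the error vectors are assumptions of this sketch, not details fixed by the description.

```python
import numpy as np

def correct_intermediate_point(nominal_refs, measured_refs, nominal_point):
    """Correct a programmed point lying between two reference targets.

    nominal_refs  : (2, 3) array, programmed XYZ of reference points 1 and 2
    measured_refs : (2, 3) array, XYZ of the same targets as measured by the
                    vision equipment (same coordinate frame as the program)
    nominal_point : (3,) array, programmed XYZ of the point to be machined
    Returns the corrected XYZ to send to the robot controller module.
    """
    nominal_refs = np.asarray(nominal_refs, dtype=float)
    measured_refs = np.asarray(measured_refs, dtype=float)
    p = np.asarray(nominal_point, dtype=float)

    errors = measured_refs - nominal_refs          # deviation seen at each target
    line = nominal_refs[1] - nominal_refs[0]       # virtual reference line
    t = np.dot(p - nominal_refs[0], line) / np.dot(line, line)
    t = np.clip(t, 0.0, 1.0)                       # position along the line (0..1)

    correction = (1.0 - t) * errors[0] + t * errors[1]
    return p + correction
```

A point located exactly at one of the targets receives that target's full measured deviation, while points between them receive a weighted mixture, which is what brings the positioning error down towards the repeatability value mentioned in paragraph [0033].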
[0037] In the event that it is necessary to move across a plane, it is necessary to go to at least a third target in order to determine the corrections in that plane.

[0038] To perform the position correction, the artificial vision equipment must have visual access to the surface of the piece at all times; for this, a pressure base with openings that allow vision of the inside, once it is positioned, is necessary.

[0039] The vision equipment also makes it possible to provide the robot with additional functionalities, such as, for example, the measurement of perpendicularity in real time, the measurement of targets, the measurement of diameters, the quality control of rivets and others.

[0040] The present invention is applicable to any type of robot, including anthropomorphic robots, parallel kinematics robots or other types.

Advantages of the invention

[0041] The automated machining head with vision presented here offers multiple advantages over the equipment currently available, the most important being that it can give an anthropomorphic robot, originally designed for the automobile industry and endowed with a relatively low precision, a notably higher machining accuracy, equivalent to that of much higher precision equipment, such as machine tools or parallel kinematic machines.

[0042] Another important advantage is that it compensates, in real time and in a continuous way, for the decentering and the loss of perpendicularity caused by the pressure of the pressure base, which are common in conventional heads and a source of errors and lack of precision.

[0043] It is also important to emphasize that, compared with the existing mechanical systems for measuring the deviations of the robot when additional forces are applied, there is the great advantage that, even if the nozzle skids or slides over the part, the vision system always makes it possible to return to the objective point.

[0044] Another additional advantage is that, since slippage does not affect it, it is possible to employ higher preload forces on the pressure base or to use more efficient process parameters.

[0045] It should also be noted that the vision equipment corrects the positioning points of the robot in real time, interacting with its controller and correcting the errors and inaccuracies of the robot.

[0046] The invention described here ensures that the final precision obtained does not depend on the precision of the robot but on its repeatability, since it improves the precision to values very close to the robot's repeatability, which is typically around 10 times better than its accuracy.

[0047] The solution provided here eliminates the need to attach high-precision encoders to the output axes of all the gearboxes of an anthropomorphic robot, together with additional control hardware and software, avoiding modifications to a catalogue robot which could affect its warranty, maintenance and repairs. Instead, it uses a solution made up of three-dimensional vision equipment, a computer system, a communications module and control software, forming a more economical, effective and simple solution.

[0048] The advantages of this invention should be especially emphasized in that it allows an optimization and improvement of the drilling, milling and riveting process, improving the flexibility, productivity and efficiency of flexible cells and contributing to innovation in manufacturing technique with a noticeable decrease in costs.
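As an aside on the three-point case mentioned in paragraph [0037] and the real-time perpendicularity measurement mentioned in paragraph [0039], the following minimal sketch shows how a surface normal and the angular deviation of the tool axis could be obtained from three measured points. The function names and the use of NumPy are assumptions, and the three points are assumed to be non-collinear.

```python
import numpy as np

def surface_normal(p0, p1, p2):
    """Unit normal of the plane through three non-collinear reference points."""
    n = np.cross(np.asarray(p1, dtype=float) - np.asarray(p0, dtype=float),
                 np.asarray(p2, dtype=float) - np.asarray(p0, dtype=float))
    return n / np.linalg.norm(n)

def perpendicularity_error_deg(tool_axis, p0, p1, p2):
    """Angle in degrees between the tool axis and the surface normal.

    Zero means the head is exactly normal (perpendicular) to the workpiece
    surface defined by the three measured points.
    """
    n = surface_normal(p0, p1, p2)
    a = np.asarray(tool_axis, dtype=float)
    a = a / np.linalg.norm(a)
    cos_angle = min(abs(float(np.dot(a, n))), 1.0)   # sign-independent, clamped
    return float(np.degrees(np.arccos(cos_angle)))
```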
Description of the Figures

[0049] In order to better understand the purpose of the present invention, a preferred practical embodiment of an automated machining head with vision is represented in the attached drawings.

[0050] In said drawings, Figure -1- shows a block diagram of the complete assembly of the head, the robot, the control computer system, the robot controller module and the communications module.

[0051] Figure -2- shows a perspective view of the head.

[0052] Figure -3- shows a bottom view and a front view of the head.

[0053] Figure -4- shows a sectioned side view of the head.

[0054] Figure -5- shows a perspective view of part of the head, detailing the vertical displacement device.

[0055] Figure -6- shows a perspective view of the pressure base.

[0056] Figure -7- shows an enlarged plan view and profile of the calibration tool.

Preferred mode of the invention

[0057] The head and the automated machining procedure that are the object of the present invention are associated with a robot arm (1) to perform various machining tasks, especially drilling and riveting, under the control of a robot controller module (2), and basically comprise, as can be seen in the attached drawings, a pressure base (3), which surrounds the machining tool (4), associated with a vertical displacement device (5) equipped with a mechanical lock (6), a vision equipment, of the 3D type and equipped with at least two video cameras (7), connected to computer equipment (8) provided with specific software (9), and a communications module (10). The communications module (10) can be either a specific hardware device or a part of the specific software (9).

[0058] It is foreseen that the vision equipment optionally includes a laser device (15) that projects a beam in the shape of a cross inside the pressure base (3). The projection of this cross on the part to be drilled is used by the artificial vision cameras to know in which orientation the head is with respect to the part.

[0059] The robot controller module (2) can be either an external numerical control or the robot controller itself that the robot manufacturers offer.

[0060] The pressure base (3) is formed by a bell, which surrounds the machining tool (4), equipped with side windows (11) that allow the video cameras (7) to view the machining tool (4) located inside, its work surface, and the projection of the laser device (15). These side windows (11) of the pressure base (3) have closures (12) that block the view, by the video cameras (7), of the machining tool (4) located inside, preventing the exit of chips during machining.

[0061] The closures (12) of the side windows (11) of the pressure base (3) are implemented, in a preferred mode, by means of a second bell (13), concentric with the pressure base (3) and able to pivot with respect to it, provided with openings that coincide with the side windows (11) in an open position and which, by pivoting the second bell (13) with respect to the pressure base (3) into a closed position, no longer coincide with the side windows (11), thereby closing the pressure base (3). This second concentric bell (13) can be inside or outside the pressure base (3).

[0062] The computer equipment (8) is connected, via the communications module (10), to the robot controller module (2) of the robot arm (1), making corrections to the orders of the robot controller module (2) as a function of the images received from the video cameras (7) that make up the vision equipment.
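The data flow of paragraph [0062] (vision computer (8) to communications module (10) to robot controller module (2)) can be pictured with the following purely illustrative sketch. The class names, the fields of the correction message and the `apply_offset` call are hypothetical, since the description does not fix any transport or message format.

```python
from dataclasses import dataclass

@dataclass
class PoseCorrection:
    """Incremental correction for the robot controller module (units assumed: mm and degrees)."""
    dx: float = 0.0
    dy: float = 0.0
    dz: float = 0.0
    drx: float = 0.0
    dry: float = 0.0
    drz: float = 0.0

class CommunicationsModule:
    """Illustrative bridge between the vision computer (8) and the robot controller (2).

    The real transport (fieldbus, Ethernet, vendor API) is not specified in the
    description; send() only models the direction of the data flow.
    """
    def __init__(self, controller):
        self.controller = controller

    def send(self, correction: PoseCorrection) -> None:
        # The controller is expected to apply the offset to its current target pose.
        self.controller.apply_offset(correction)

class _StubController:
    """Stand-in for the robot controller module (2); real controllers are vendor-specific."""
    def apply_offset(self, c: PoseCorrection) -> None:
        print(f"offset received: dx={c.dx} dy={c.dy} dz={c.dz}")

if __name__ == "__main__":
    bridge = CommunicationsModule(_StubController())
    bridge.send(PoseCorrection(dx=-0.05, dy=0.02))   # e.g. 50 and 20 micrometre offsets
```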
[0063] This machining head with vision comprises a specific operating procedure that is divided into several phases: a first phase of measurement on the workpiece to be machined, a second phase of positioning the head at the objective work point, a third phase of correction of the head position by means of vision, and a fourth phase of machining or of the specific operation.

[0064] In the first, measurement phase, to improve the robot's positioning accuracy, reference points are taken, using the video cameras (7) that form the vision equipment, on the part to be machined in the zone close to the area to be machined, taking a minimum of two points to draw a virtual line or, if a reference plane is to be determined, a minimum of three points.

[0065] For this, in a first step, the reference points are determined. In a second step, the position on the line or plane created by the previously calculated reference points is determined and, in a third step, a forecast or estimation is made of the positioning errors that the robot (2) will make when directed to an intermediate point between the references taken so that, as a result, the final position can be corrected.

[0066] The first step, in which the reference points are determined, comprises the following operations:
• Reference point 1 is measured, in position 1, with the vision equipment.
• The robot (2) repositions itself at the new position 1, now position 2, using the measured data.
• The machining head translates/rotates, preferably by 10 mm.
• It returns to position 2.
• Reference point 1 is measured again.
• The robot (2) repositions itself at the new position 2, now position 3, using the measured data.
• The machining head translates/rotates, preferably by 10 mm.
• It returns to position 3.
• Reference point 1 is measured and stored as a control point.

[0067] These operations are repeated to determine each of the reference points.

[0068] The second step, in which the positioning on the line or plane created by the previously calculated reference points is determined, comprises the following operations:
• The actual distance between each pair of reference points is introduced.
• The corrections that must be applied to the intermediate points of the line or plane determined by the reference points are calculated, using the specific software (9) incorporated in the computer equipment (8), by means of the real value of said reference points.

[0069] The second phase, of positioning the head in the zone to be machined, comprises a first step of displacing the head, by means of the movement of the robot arm (1) ordered by the robot controller module (2), to the coordinates at which machining is desired.

[0070] The third phase, of correcting the position of the head by means of vision, comprises a first step that takes place in one of two ways, depending on the type of material or surface to be machined:
• In the case of normal surfaces, neither shiny nor polished, a reference image of the piece is taken using the video cameras (7) that form the vision equipment, through the side windows (11) of the pressure base (3), which will be in their open position. By analysing its roughness by means of the specific software (9) incorporated in the computer equipment (8), it is possible to locate the objective point before applying the forces that deform the robot (2), identifying it by the image of its roughness.
• In the case of very shiny or polished surfaces, the head itself makes a small mark or perforation, acting slightly with the machining tool (4) on the objective point on the surface of the part. This mark is taken as a reference image by means of the video cameras (7) that form the vision equipment, through the side windows (11) of the pressure base (3), which will be in their open position, before the application of the additional forces, the objective point being identified by means of the image of that mark as a reference.

[0071] The third phase, of correcting the position of the head by means of vision, continues with a second step of lowering the pressure base (3), by means of the vertical displacement device (5), onto the surface to be machined. This descent, with the consequent force exerted by the pressure base (3) on the zone to be machined, produces a displacement of the robot arm (1) which implies a deviation from the originally desired position and orientation, that is, a positioning error. The phase continues with a third step in which the vision system, comparing the image now obtained through the video cameras (7) that form the vision equipment, through the side windows (11) of the pressure base (3), which remain in their open position, with the reference image obtained in the first step, generates an order for displacement of the robot arm (1) in the desired direction, taking another image of the surface to be machined through the side windows (11) of the pressure base (3), and repeating this step until the image matches the reference image around the operating point, that is, until the coordinates of the current operating point coincide with those fixed in the second phase of positioning the head and the orientation reached coincides with the desired one, which can be the reference obtained in the first step or simply the normal to the surface at the point of operation, thus eliminating the error caused by the deformation and displacement of the pressure base (3).

[0072] The fourth, machining phase comprises a first step of mechanically locking (6) the vertical displacement device (5) of the pressure base (3), a second step of activating the closures (12) of the side windows (11) of the pressure base (3), and a third step in which the machining tool (4) located inside performs the machining on the surface.

[0073] Optionally, a prior calibration phase can be included, which consists in the use of a calibration tool (14) for the adjustment of the operating parameters of the head, in such a way that, in the calibration process, the correlation between the three coordinate systems is verified: that of the machining tool, that of the vision system and/or that of the robot controller.

[0074] The vision equipment also makes it possible to provide the robot arm (1) with additional functionalities, such as, for example, the measurement of perpendicularity in real time, the measurement of targets, the measurement of diameters, the quality control of rivets and others.
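The second and third steps of the third phase, described in paragraph [0071], amount to a closed correction loop. The sketch below is one possible reading of it; the three callables, the tolerance value and the iteration limit are assumptions, since the description only requires repeating the step until the current image coincides with the reference image.

```python
import math

def correct_position(capture_image, estimate_offset, move_robot,
                     reference_image, tolerance_mm=0.01, max_iterations=10):
    """Iterative correction of the head position after the pressure base is lowered.

    capture_image()                    -> image seen through the open side windows (11)
    estimate_offset(reference, image)  -> (dx, dy) displacement in mm between the two images
    move_robot(dx, dy)                 -> incremental displacement order to the controller (2)
    The loop repeats until the current image coincides with the reference image
    within tolerance_mm, or the iteration limit is reached.
    """
    for _ in range(max_iterations):
        current = capture_image()
        dx, dy = estimate_offset(reference_image, current)
        if math.hypot(dx, dy) < tolerance_mm:
            return True               # operating point recovered
        move_robot(-dx, -dy)          # order the robot back toward the objective point
    return False                      # residual error still above tolerance
```

Whether one pass or several are needed matches paragraph [0023], where the last steps "may or may not be iterative" until the residual error falls below a determined value.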
Claims:
Claims (15)

[0001] 1. Automated Machining Head with Vision, of the type used industrially in association with robot arms (1) to perform various machining tasks, such as drilling and riveting, controlled by a robot controller module (2), which includes vision equipment equipped with at least two video cameras (7), connected to a computer (8) with specific software (9), and a communications module (10), characterized by comprising: - a pressure base (3), which encloses the machining tool (4), - said pressure base (3) being associated with a vertical movement device (5) provided with a mechanical lock (6).

[0002] 2. Automated Machining Head with Vision, according to Claim 1, characterized in that the vision equipment comprises a laser device (15) that projects a beam in the shape of a cross.

[0003] 3. Automated Machining Head with Vision, according to Claim 1, characterized in that the pressure base (3) is formed by a protective cover provided with side windows (11) that allow the video cameras (7) to see the machining tool (4) located inside and its working surface.

[0004] 4. Automated Machining Head with Vision, according to Claim 3, characterized in that the side windows (11) of the pressure base (3) have covers (12) that block the view, by the video cameras (7), of the machining tool (4) located inside, preventing chips from escaping during machining.

[0005] 5. Automated Machining Head with 3D Vision, according to Claim 4, characterized in that the covers (12) of the side windows (11) of the pressure base (3) are achieved by means of a second protective cover (13), concentric with the pressure base (3), provided with the ability to rotate in relation to the latter, and provided with openings that coincide with the side windows (11) in an open position and do not coincide in a closed position of the pressure base (3).

[0006] 6. Operating Procedure of the Automated Machining Head with Vision, as defined in the previous Claims, characterized by executing corrections in the commands of the robot controller module (2) according to the image received from the video camera or video cameras (7) that form the vision equipment, and by including - a first phase of measuring the workpiece to be machined, - a second phase of positioning the head at the target work point, - a third phase of correcting the position and orientation of the head via vision, and - a fourth phase of machining or of the specific operation for which the device was designed.

[0007] 7. Operating Procedure of the Automated Machining Head with Vision, according to Claim 6, characterized in that, in the first measurement phase, the reference points are taken via the video cameras (7) that form the vision equipment, on the workpiece to be machined in the zone near the area to be machined, taking a minimum of two points to draw a virtual line, or a minimum of three points to determine a reference plane, and in that it includes - a first step of determining the reference points, - a second step of determining the positioning on the line or plane that the previously calculated reference points create, and - a third step of forecasting or estimating, through the specific software (9) incorporated in the computer (8), the positioning errors that the robot (2) will make when it is directed to an intermediate point between the references taken, so that the final position can thus be corrected.

[0008] 8. Operating Procedure of the Automated Machining Head with Vision, according to Claim 7, characterized in that the step of determining the reference points includes - a first operation in which reference point 1 is measured with the vision equipment in position 1, - a second operation in which the robot (2) repositions itself in the new position 1, now position 2, by means of the measured data, - a third operation in which the machining head performs a translation/rotation, - a fourth operation in which it returns to position 2, - a fifth operation in which reference point 1 is measured again, - a sixth operation in which the robot (2) repositions itself in the new position 2, now position 3, by means of the measured data, - a seventh operation in which the machining head performs a translation/rotation, - an eighth operation in which it returns to position 3, and - a ninth operation in which reference point 1 is measured and stored as a control point, these operations being repeated for each of the reference points.

[0009] 9. Operating Procedure of the Automated Machining Head with Vision, according to Claim 7, characterized in that the step of determining the positioning on the line or plane that the previously calculated reference points create includes - a first operation in which the distance between each pair of reference points is introduced, and - a second operation in which, through the specific software (9) incorporated in the computer (8), the corrections that have to be applied at intermediate points and/or points close to the line or plane created by the reference points, determined by means of the real value of said reference points, are calculated.

[0010] 10. Operating Procedure of the Automated Machining Head with Vision, according to Claim 6, characterized in that the second phase of positioning the head in the area to be machined comprises a step of moving the head, via the movement of the robot arm (1) controlled by the robot controller module (2), to the coordinates at which machining is required.

[0011] 11. Operating Procedure of the Automated Machining Head with Vision, according to Claim 6, characterized in that the third phase of correction of the position of the head via vision comprises a first step in which, in the case of normal surfaces that are not shiny or polished, a reference image of the workpiece is taken by means of the video cameras (7) that form the vision equipment, through the side windows (11) of the pressure base (3), which will be in their open position, analysing its roughness through the specific software (9) incorporated in the computer (8) and locating the point of operation in relation to it.

[0012] 12. Operating Procedure of the Automated Machining Head with Vision, according to Claim 6, characterized in that the third phase of correcting the position of the head via vision comprises a first step in which, in the case of very shiny or polished surfaces, the head itself makes a small mark or notch, acting slightly with the machining tool (4) at the target point on the workpiece surface, whose reference image will be taken via the video cameras (7) that form the vision equipment, through the side windows (11) of the pressure base (3), which will be in their open position, before the application of additional forces, the target point being identified, by means of the image of said mark, as a reference.

[0013] 13. Operating Procedure of the Automated Machining Head with Vision, according to Claim 6, characterized in that the third phase of correction of the head position by means of vision includes - a second step of lowering the pressure base (3), by means of the vertical movement device (5), onto the surface to be machined, in which the consequent force exerted by the pressure base (3) on the area to be machined causes a movement of the robot arm (1) that involves a deviation from the originally requested position and orientation, implying a positioning error, followed by - a third step in which the vision system, comparing the image now obtained by the video cameras (7) that form the vision equipment, through the side windows (11) of the pressure base (3), which will remain in their open position, with the reference image obtained in the first step and used as a reference, generates a command for the robot arm (1) to move in the requested direction, again taking another image of the surface to be machined through the side windows (11) of the pressure base (3), and repeating this step until the image coincides with the reference image around the operating point, that is, until the coordinates of the current operating point coincide with those established in the second phase of positioning the head, and the orientation reached coincides with that required, which may be that of the reference obtained in the first step, or simply the normal to the surface at the point of operation.

[0014] 14. Operating Procedure of the Automated Machining Head with Vision, according to Claim 6, characterized in that the fourth machining phase comprises - a first step of mechanically locking the vertical movement device (5) of the pressure base (3), - a second step of activating the covers (12) of the side windows (11) of the pressure base (3), and - a third step in which the machining tool (4) located inside performs the machining of the surface.

[0015] 15. Operating Procedure of the Automated Machining Head with Vision, according to Claim 6, characterized by including an optional prior calibration phase, which consists of using a calibration tool (14) to adjust the operating parameters of the head, in such a way that, in said calibration phase, the correlation is verified between the coordinate system of the machining tool, that of the vision system and that of the robot controller.
Similar technologies:
Publication number | Publication date | Patent title
BR112015028755B1 | 2021-02-02 | automated processing head with vision and operation procedure of the head
CN105881102B | 2017-09-05 | The positioner of the workpiece of shoot part is used
JP4638327B2 | 2011-02-23 | Parallel mechanism device, parallel mechanism device calibration method, calibration program, and recording medium
CN103459102B | 2016-01-06 | Robot controller, robot control method, program and recording medium
US7724380B2 | 2010-05-25 | Method and system for three-dimensional measurement
JP2012104136A | 2012-05-31 | Method for improving accuracy of machines
US10184774B2 | 2019-01-22 | Correcting apparatus and correcting method
Liu et al. | 2018 | Identification of position independent geometric errors of rotary axes for five-axis machine tools with structural restrictions
JP6091826B2 | 2017-03-08 | Processing device control device, processing device, and processing data correction method
Chen et al. | 2016 | A ballbar test for measurement and identification the comprehensive error of tilt table
Lee et al. | 2017 | Industrial robot calibration method using denavit—Hatenberg parameters
KR102228835B1 | 2021-03-16 | Industrial robot measuring system and method
US20190152064A1 | 2019-05-23 | Method for controlling an end element of a machine tool, and a machine tool
US20200108506A1 | 2020-04-09 | Device for acquiring a position and orientation of an end effector of a robot
KR100301231B1 | 2001-11-05 | The automatic compensation method for robot working path
CN109968347B | 2022-01-14 | Zero calibration method of seven-axis robot
Liu et al. | 2019 | A line measurement method for geometric error measurement of the vertical machining center
JP5351083B2 | 2013-11-27 | Teaching point correction apparatus and teaching point correction method
US20160077516A1 | 2016-03-17 | Data compensation device, data compensation method, and machining apparatus
KR100336459B1 | 2002-05-15 | Method for off-line control of robot
JP2015059897A | 2015-03-30 | Measuring device, processing device, measuring method, and processing method
JP6403298B1 | 2018-10-10 | NC machining apparatus and machining part manufacturing method
Loser et al. | 2016 | Real-time robot positioning based on measurement feedback control
JP2016118928A | 2016-06-30 | Processing method, alignment mark component, and manufacturing method of component
KR100476163B1 | 2005-03-15 | Method of finding the reference coordinate system of object of automatic production line
Patent family:
Publication number | Publication date
EP2998080A4 | 2017-05-03
ES2671468T3 | 2018-06-06
CN105377513A | 2016-03-02
EP2998080B1 | 2018-03-07
CA2912589A1 | 2014-11-20
ES2522921A1 | 2014-11-19
EP2998080A1 | 2016-03-23
US9919428B2 | 2018-03-20
KR102271941B1 | 2021-07-02
KR20160010868A | 2016-01-28
US20160082598A1 | 2016-03-24
BR112015028755A2 | 2017-07-25
WO2014184414A1 | 2014-11-20
PT2998080T | 2018-06-06
ES2522921B2 | 2015-07-30
CN105377513B | 2018-05-15
CA2912589C | 2021-06-08
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
JP2548027B2 | 1988-06-30 | 1996-10-30 | ファナック株式会社 | Arc vision sensor operation method
US5014183A | 1988-10-31 | 1991-05-07 | Cincinnati Milacron, Inc. | Method and means for path offsets memorization and recall in a manipulator
ES2036909B1 | 1990-04-18 | 1994-01-01 | Ind Albajar S A | ARTIFICIAL VISION SYSTEM FOR ROBOTIZED FRUIT COLLECTION.
US5194791A | 1990-07-19 | 1993-03-16 | Mcdonnell Douglas Corporation | Compliant stereo vision target
US5615474A | 1994-09-09 | 1997-04-01 | Gemcor Engineering Corp. | Automatic fastening machine with statistical process control
KR100237302B1 | 1997-05-13 | 2000-01-15 | 윤종용 | Method for controlling a touch sensor of welding robot
ES2142239B1 | 1997-08-04 | 2001-03-16 | Univ Murcia | ROBOT WITH ARTIFICIAL VISION FOR AUTOMATIC SHOOTING OF SHOES.
ES2152171B1 | 1998-11-30 | 2001-08-01 | Univ Madrid Carlos Iii | 3D VISION SYSTEM WITH VIDEO SIGNAL HARDWARE PROCESSING.
FR2791916B1 | 1999-04-06 | 2001-05-04 | Abb Preciflex Systems | METHOD FOR HOLDING A PART IN THE POSITION IN AN ASSEMBLY STATION
US6430472B1 | 1999-12-20 | 2002-08-06 | Servo-Robot Inc. | Robot feature tracking devices and methods
WO2003004222A2 | 2001-07-02 | 2003-01-16 | Microbotic A/S | Apparatus comprising a robot arm adapted to move object handling hexapods
US6855099B2 | 2001-10-31 | 2005-02-15 | The Boeing Company | Manufacturing system for aircraft structures and other large structures
JP2005515910A | 2002-01-31 | 2005-06-02 | ブレインテック カナダ インコーポレイテッド | Method and apparatus for single camera 3D vision guide robotics
CA2369845A1 | 2002-01-31 | 2003-07-31 | Braintech, Inc. | Method and apparatus for single camera 3d vision guided robotics
US7174238B1 | 2003-09-02 | 2007-02-06 | Stephen Eliot Zweig | Mobile robotic system with web server and digital radio links
FI123306B | 2004-01-30 | 2013-02-15 | Wisematic Oy | Robot tool system, and its control method, computer program and software product
US7034262B2 | 2004-03-23 | 2006-04-25 | General Electric Company | Apparatus and methods for repairing tenons on turbine buckets
ES2255386B1 | 2004-05-13 | 2007-10-01 | Loxin 2002, S.L. | IMPROVED AUTOMATIC TOWING SYSTEM.
WO2006019970A2 | 2004-07-14 | 2006-02-23 | Braintech Canada, Inc. | Method and apparatus for machine-vision
JP2006224291A | 2005-01-19 | 2006-08-31 | Yaskawa Electric Corp | Robot system
FR2897009B1 | 2006-02-07 | 2008-05-09 | Alema Automation Soc Par Actio | METHOD FOR POSITIONING A TOOL ASSEMBLY AT THE END OF AN ARTICULATED ARM AND DEVICE FOR IMPLEMENTING IT
US7483151B2 | 2006-03-17 | 2009-01-27 | Alpineon D.O.O. | Active 3D triangulation-based imaging method and device
CN101092034A | 2006-06-20 | 2007-12-26 | 力晶半导体股份有限公司 | Adjusting device for facility of handling wafers, and adjusting method for facility of handling wafers
FR2912672B1 | 2007-02-16 | 2009-05-15 | Airbus France Sa | METHOD FOR ASSEMBLING TWO ASSEMBLIES, SUCH AS AIRCRAFT FUSELAGE ASSEMBLIES
DE102007041423A1 | 2007-08-31 | 2009-03-05 | Abb Technology Ab | Robot tool, robot system and method for machining workpieces
CN101205662B | 2007-11-26 | 2011-04-20 | 天津工业大学 | Robot sewing system for three-dimensional composite material perform
DE102008042260B4 | 2008-09-22 | 2018-11-15 | Robert Bosch Gmbh | Method for the flexible handling of objects with a handling device and an arrangement for a handling device
US8135208B1 | 2009-01-15 | 2012-03-13 | Western Digital Technologies, Inc. | Calibrated vision based robotic system utilizing upward and downward looking cameras
EP2249286A1 | 2009-05-08 | 2010-11-10 | Honda Research Institute Europe GmbH | Robot with vision-based 3D shape recognition
KR20100137882A | 2009-06-23 | 2010-12-31 | 현대중공업 주식회사 | Work trajectory modification method of industrial robot
CN101726296B | 2009-12-22 | 2013-10-09 | 哈尔滨工业大学 | Vision measurement, path planning and GNC integrated simulation system for space robot
KR20110095700A | 2010-02-19 | 2011-08-25 | 현대중공업 주식회사 | Industrial robot control method for workpiece object pickup
EP2711142B1 | 2012-09-20 | 2014-09-17 | Comau S.p.A. | Industrial robot having electronic drive devices distributed on the robot structure
EP3083160A4 | 2013-12-17 | 2017-08-23 | Syddansk Universitet | Device for dynamic switching of robot control points
EP3045989B1 | 2015-01-16 | 2019-08-07 | Comau S.p.A. | Riveting apparatus
CN104708050B | 2015-02-13 | 2017-03-29 | 深圳市圆梦精密技术研究院 | Flexible drilling system
US9884372B2 | 2015-05-04 | 2018-02-06 | The Boeing Company | Method and system for defining the position of a fastener with a peen mark
JP6665450B2 | 2015-08-31 | 2020-03-13 | セイコーエプソン株式会社 | Robot, control device, and robot system
CN105067046B | 2015-09-15 | 2017-11-10 | 沈阳飞机工业(集团)有限公司 | A kind of automatic drill riveter calibration method
JP6710946B2 | 2015-12-01 | 2020-06-17 | セイコーエプソン株式会社 | Controllers, robots and robot systems
US9981381B1 | 2016-06-08 | 2018-05-29 | X Development Llc | Real time generation of phase synchronized trajectories
CN107876953A | 2016-09-29 | 2018-04-06 | 福特环球技术公司 | Jointing machine with position guidance system
KR101882473B1 | 2016-12-08 | 2018-07-25 | 한국생산기술연구원 | An apparatus for inspecting the workpiece and a method for inspectin using the same
US10782670B2 | 2016-12-14 | 2020-09-22 | The Boeing Company | Robotic task system
IT201700071176A1 | 2017-06-26 | 2018-12-26 | Proge Tec S R L | AUTOMATIC RIVETING SYSTEM OF HANDLES ON POTS
CN107472909B | 2017-07-31 | 2019-04-30 | 浩科机器人(苏州)有限公司 | A kind of glass handling machine people with laser detection function
US10571260B2 | 2017-09-06 | 2020-02-25 | The Boeing Company | Automated rivet measurement system
US10786901B2 | 2018-02-09 | 2020-09-29 | Quanta Storage Inc. | Method for programming robot in vision base coordinate
CN108972623B | 2018-07-27 | 2021-07-20 | 武汉理工大学 | Robot tail end clamping error automatic correction method based on force control sensor
CN109211222A | 2018-08-22 | 2019-01-15 | 扬州大学 | High-accuracy position system and method based on machine vision
CN109352663B | 2018-09-28 | 2020-11-20 | 航天材料及工艺研究所 | Robot automatic accurate positioning hole making device and method for composite cabin section
CN111251290A | 2018-11-30 | 2020-06-09 | 汉翔航空工业股份有限公司 | System and method for compensating selectable path of mechanical arm
ES2788274A1 | 2019-04-17 | 2020-10-20 | Loxin 2002 Sl | Machining head with active correction
CN110142372A | 2019-06-14 | 2019-08-20 | 眉山中车紧固件科技有限公司 | Rivet robot system
WO2021048579A1 | 2019-09-11 | 2021-03-18 | Dmg森精機株式会社 | System and machine tool
KR102281544B1 | 2019-11-18 | 2021-07-27 | 한국생산기술연구원 | Hole processing method considering machining error of parts and hole processing device
ES2843739B2 | 2020-01-20 | 2021-11-24 | Omicron 2020 Sl | ASSEMBLY PROCEDURE BASED ON A COLLABORATIVE ROBOTIC SYSTEM WITH INTERCHANGEABLE APPLICATIONS FOR AUTOMATED DRILLING, COUNTERSUNKING AND RIVETING OPERATIONS
US11192192B2 | 2020-02-10 | 2021-12-07 | The Boeing Company | Method and apparatus for drilling a workpiece
Legal status:
Date | Code | Legal event
2018-11-06 | B06F | Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]
2020-03-17 | B06U | Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]
2020-12-08 | B09A | Decision: intention to grant [chapter 9.1 patent gazette]
2021-02-02 | B16A | Patent or certificate of addition of invention granted. Free format text: term of validity of 20 (twenty) years counted from 15/05/2014, subject to the legal conditions.
Priority:
[返回顶部]
Application number | Filing date | Patent title
ES201330713A | 2013-05-17 | Head and automatic machining procedure with vision
ESP201330713
PCT/ES2014/070403 | WO2014184414A1